45 research outputs found

    Efficient training algorithms for HMMs using incremental estimation

    Typically, parameter estimation for a hidden Markov model (HMM) is performed using an expectation-maximization (EM) algorithm with the maximum-likelihood (ML) criterion. The EM algorithm is an iterative scheme that is well-defined and numerically stable, but convergence may require a large number of iterations. For speech recognition systems utilizing large amounts of training material, this results in long training times. This paper presents an incremental estimation approach to speed up the training of HMMs without any loss of recognition performance. The algorithm selects a subset of data from the training set, updates the model parameters based on that subset, and then iterates the process until the parameters converge. The advantage of this approach is a substantial increase in the number of EM iterations per training token, which leads to faster training. In order to achieve reliable estimation from a small fraction of the complete data set at each iteration, two training criteria are studied: ML and maximum a posteriori (MAP) estimation. Experimental results show that training with the incremental algorithms is substantially faster than the conventional (batch) method and suffers no loss of recognition performance. Furthermore, the incremental MAP-based training algorithm improves performance over the batch version.
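    As a rough illustration of the incremental idea (re-estimating the parameters from a randomly chosen subset of the training data at each EM iteration), here is a minimal sketch for a discrete-observation HMM. The model size, toy data, batch size, and ML-only M-step are illustrative assumptions, not the paper's actual speech-recognition setup.

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """Scaled forward-backward pass for one observation sequence.

    Returns per-frame state posteriors (gamma) and summed transition
    posteriors (xi), the sufficient statistics needed by the M-step.
    """
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((N, N))
    for t in range(T - 1):
        x = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
        xi += x / x.sum()
    return gamma, xi

def incremental_em(sequences, n_states, n_symbols, batch_size=4, n_iters=50, seed=0):
    """Incremental (mini-batch) EM: each iteration uses only a subset of sequences."""
    rng = np.random.default_rng(seed)
    pi = np.full(n_states, 1.0 / n_states)
    A = rng.dirichlet(np.ones(n_states), size=n_states)   # transition matrix
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)  # emission matrix
    for _ in range(n_iters):
        batch = rng.choice(len(sequences), size=min(batch_size, len(sequences)),
                           replace=False)
        pi_acc = np.zeros(n_states)
        A_acc = np.zeros((n_states, n_states))
        B_acc = np.zeros((n_states, n_symbols))
        for idx in batch:                      # E-step over the subset only
            obs = sequences[idx]
            gamma, xi = forward_backward(obs, pi, A, B)
            pi_acc += gamma[0]
            A_acc += xi
            for t, o in enumerate(obs):
                B_acc[:, o] += gamma[t]
        # M-step (ML criterion) from the subset's sufficient statistics.
        pi = pi_acc / pi_acc.sum()
        A = A_acc / A_acc.sum(axis=1, keepdims=True)
        B = B_acc / B_acc.sum(axis=1, keepdims=True)
    return pi, A, B

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    toy_data = [rng.integers(0, 3, size=30) for _ in range(20)]  # hypothetical sequences
    pi, A, B = incremental_em(toy_data, n_states=2, n_symbols=3)
    print("estimated initial distribution:", pi)
```

    A MAP variant of the M-step would add prior (e.g. Dirichlet) counts to the accumulated statistics before normalizing, which is what makes estimation from a small fraction of the data at each iteration more robust.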

    Percolation in Models of Thin Film Depositions

    We have studied the percolation behaviour of deposits for different (2+1)-dimensional models of surface layer formation. A mixed model of deposition was used, in which particles were deposited selectively according to the random (RD) and ballistic (BD) deposition rules. In the mixed one-component models with deposition of only conducting particles, the mean height of the percolation layer (measured in monolayers) grows continuously from 0.89832 for the pure RD model to 2.605 for the pure BD model, but the percolation transition belongs to the same universality class as the 2-dimensional random percolation problem. In two-component models with deposition of conducting and insulating particles, the percolation layer height approaches infinity as the concentration of the insulating particles becomes higher than some critical value. A crossover from 2d to 3d percolation was observed with increasing percolation layer height. Comment: 4 pages, 5 figures
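    The deposition rules themselves are simple to state; the sketch below (not the paper's code) grows a deposit on an L x L substrate under a mixed rule, applying ballistic deposition with probability p_bd and random deposition otherwise. The lattice size, particle count, and mixing probability are illustrative assumptions, and the percolation analysis of the resulting layer is omitted.

```python
import numpy as np

def mixed_deposition(L=64, n_particles=50_000, p_bd=0.5, seed=0):
    """Grow a (2+1)-dimensional deposit with a mix of RD and BD rules.

    Returns the final column-height map h[i, j] (periodic boundaries).
    """
    rng = np.random.default_rng(seed)
    h = np.zeros((L, L), dtype=int)           # column heights of the deposit
    for _ in range(n_particles):
        i, j = rng.integers(0, L, size=2)     # random landing column
        if rng.random() < p_bd:
            # Ballistic deposition: the particle sticks at first contact, so
            # the new height is the larger of (own column + 1) and the
            # tallest of the four nearest-neighbour columns.
            nb = max(h[(i - 1) % L, j], h[(i + 1) % L, j],
                     h[i, (j - 1) % L], h[i, (j + 1) % L])
            h[i, j] = max(h[i, j] + 1, nb)
        else:
            # Random deposition: the particle simply stacks on its own column.
            h[i, j] += 1
    return h

if __name__ == "__main__":
    heights = mixed_deposition()
    print("mean deposit height (monolayers):", heights.mean())
```

    In the two-component version each incoming particle would additionally be labelled conducting or insulating, with percolation tested on the conducting subset of the deposit.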

    Direct Measurement of the Pseudoscalar Decay Constant fD+

    The absolute branching fraction of $D^+ \to \mu^+ \nu$ has been directly measured by analyzing a data sample of about 33 ${\rm pb^{-1}}$ collected around $\sqrt{s}=3.773$ GeV with the BES-II detector at the BEPC. At these energies, $D$ mesons are produced in pairs via $e^+e^- \to D^{+} D^{-}$. A total of $5321 \pm 149 \pm 160$ $D^-$ mesons are reconstructed from this data set. On the recoil side of the tagged $D^-$ mesons, $2.67 \pm 1.74$ purely leptonic decay events of $D^+ \to \mu^+ \nu$ are observed. This yields a branching fraction of $BF(D^+ \to \mu^+ \nu_{\mu}) = (0.122^{+0.111}_{-0.053} \pm 0.010)\%$ and a corresponding pseudoscalar decay constant $f_{D^+} = (371^{+129}_{-119} \pm 25)$ MeV. Comment: 7 pages, 8 figures, submitted to Physics Letters B in October 2004
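    For context, the decay constant is conventionally extracted from the measured branching fraction through the standard-model expression for the purely leptonic width. The relation below is the textbook formula, quoted as background rather than from the paper, with $|V_{cd}|$, the masses, and the $D^+$ lifetime $\tau_{D^+}$ taken as known inputs:

```latex
\Gamma(D^+ \to \mu^+ \nu_\mu)
  = \frac{G_F^2}{8\pi}\, f_{D^+}^2\, m_\mu^2\, m_{D^+}
    \left(1 - \frac{m_\mu^2}{m_{D^+}^2}\right)^{2} |V_{cd}|^2,
\qquad
BF(D^+ \to \mu^+ \nu_\mu) = \Gamma(D^+ \to \mu^+ \nu_\mu)\, \tau_{D^+}
```

    so that the measured $BF$ translates directly into a value of $f_{D^+}$.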

    The huge microphone array

    No full text

    The Huge Microphone Array. 2

    No full text

    Division polynomials and canonical local heights on hyperelliptic Jacobians

    We generalize the division polynomials of elliptic curves to hyperelliptic Jacobians over the complex numbers. We construct them by using the hyperelliptic sigma function. Using the division polynomials, we describe a condition for a point on the Jacobian to be a torsion point. We prove several properties of the division polynomials, such as determinantal expressions and recurrence formulas. We also study relations among the sigma function, the division polynomials, and the canonical local height functions.
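    For orientation, the construction generalizes the classical division polynomials of an elliptic curve $E\colon y^2 = x^3 + Ax + B$, recalled below as background (these are the standard elliptic-curve formulas, not taken from the paper):

```latex
\psi_0 = 0, \quad \psi_1 = 1, \quad \psi_2 = 2y, \quad
\psi_3 = 3x^4 + 6Ax^2 + 12Bx - A^2,
\psi_4 = 4y\left(x^6 + 5Ax^4 + 20Bx^3 - 5A^2x^2 - 4ABx - 8B^2 - A^3\right),
\psi_{2m+1} = \psi_{m+2}\psi_m^3 - \psi_{m-1}\psi_{m+1}^3, \qquad
\psi_{2m} = \frac{\psi_m}{2y}\left(\psi_{m+2}\psi_{m-1}^2 - \psi_{m-2}\psi_{m+1}^2\right)
\quad (m \ge 2)
```

    A nonzero point $P \in E$ is an $n$-torsion point exactly when $\psi_n(P) = 0$; the hyperelliptic division polynomials of the paper, built from the hyperelliptic sigma function in place of the Weierstrass sigma function, play the analogous role for points on the Jacobian.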